Fast Generalized Conditional Gradient Method with Applications to Matrix Recovery Problems

Authors

  • Dan Garber
  • Shoham Sabach
  • Atara Kaplan
Abstract

Motivated by matrix recovery problems such as Robust Principal Component Analysis, we consider a general optimization problem of minimizing a smooth and strongly convex loss applied to the sum of two blocks of variables, where each block of variables is constrained or regularized individually. We present a novel Generalized Conditional Gradient method which is able to leverage the special structure of the problem to obtain faster convergence rates than those attainable via standard methods, under a variety of interesting assumptions. In particular, our method is appealing for matrix problems in which one of the blocks corresponds to a low-rank matrix and avoiding prohibitive full-rank singular value decompositions, which are required by most standard methods, is most desirable. Importantly, while our initial motivation comes from problems which originated in statistics, our analysis does not impose any statistical assumptions on the data.
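To make the structure concrete, the sketch below shows one plausible hybrid iteration for a toy instance of this template: minimize 0.5*||X + Y - M||_F^2 with the low-rank block X constrained to a nuclear-norm ball ||X||_* <= tau and the second block Y regularized by an l1 penalty. The low-rank block is updated by a conditional-gradient step whose linear minimization oracle needs only the leading singular pair (computed here by power iteration, with no full SVD), while Y takes a proximal soft-thresholding step. All names (M, tau, lam, eta) and the objective, step-size, and update-order choices are illustrative assumptions, not the authors' exact method or its analyzed schedule.

import numpy as np

def top_singular_pair(G, n_iter=50):
    # Approximate the leading singular vectors of G by power iteration (no full SVD).
    rng = np.random.default_rng(0)
    v = rng.standard_normal(G.shape[1])
    v /= np.linalg.norm(v)
    for _ in range(n_iter):
        u = G @ v
        u /= np.linalg.norm(u)
        v = G.T @ u
        v /= np.linalg.norm(v)
    return u, v

def soft_threshold(Z, t):
    return np.sign(Z) * np.maximum(np.abs(Z) - t, 0.0)

def gcg_step(X, Y, M, tau, lam, eta):
    # One hybrid iteration: rank-one conditional-gradient update on X,
    # proximal-gradient (soft-thresholding) update on Y.
    grad = X + Y - M                          # gradient of 0.5*||X + Y - M||_F^2 w.r.t. X and Y
    u, v = top_singular_pair(-grad)           # linear minimization oracle over the nuclear-norm ball
    S = tau * np.outer(u, v)                  # extreme point: rank-one matrix with nuclear norm tau
    X_new = (1.0 - eta) * X + eta * S         # conditional-gradient step keeps the rank small
    Y_new = soft_threshold(Y - eta * grad, eta * lam)   # prox step on the second block
    return X_new, Y_new

The point of exploiting this structure is that each iteration costs a few matrix-vector products rather than a full-rank singular value decomposition.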


Similar articles

Equivalence of a Generalized Conditional Gradient Method and the Method of Surrogate Functionals

This article combines techniques from two fields of applied mathematics: optimization theory and inverse problems. We investigate the equivalence of the classical conditional gradient method and the surrogate method, which has been recently proposed for solving inverse problems. The method of surrogate functionals aims at the solution of non-quadratic minimization problems where the solution is...
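For orientation, in the linear case with an l1 sparsity penalty the surrogate-functional iteration reduces to classical iterative soft-thresholding. The sketch below is that textbook special case (A, y, lam, and the bound L >= ||A||_2^2 are generic placeholders), meant only to illustrate the type of iteration being compared, not the more general non-quadratic setting of the article.

import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def surrogate_iteration(A, y, lam, L, n_iter=200):
    # Iterative soft-thresholding for 0.5*||A x - y||^2 + lam*||x||_1,
    # i.e. repeated minimization of a quadratic surrogate; requires L >= ||A||_2^2.
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x + A.T @ (y - A @ x) / L, lam / L)
    return x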

A generalized conditional gradient method for nonlinear operator equations

The intention of this paper is to show the applicability of a generalized conditional gradient method for the minimization of Tikhonov-type functionals, which occur in the regularization of nonlinear inverse problems with sparsity constraints. We consider functionals of Tikhonov type where the usual quadratic penalty term is replaced by the pth power of a weighted p-norm. First of all, we analy...
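In the usual notation for this line of work (F the nonlinear forward operator, g the data, psi_k a basis or frame, w_k > 0 weights; these symbols are assumptions for illustration), the functionals in question are roughly of the form

    J_alpha(u) = ||F(u) - g||^2 + alpha * sum_k w_k |<u, psi_k>|^p,   1 <= p <= 2,

and the generalized conditional gradient method linearizes only the smooth discrepancy term at each iteration while keeping the penalty term intact.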

Gradient-Based Algorithms with Applications to Signal Recovery Problems

This chapter presents in a self-contained manner recent advances in the design and analysis of gradient-based schemes for specially structured smooth and nonsmooth minimization problems. We focus on the mathematical elements and ideas for building fast gradient-based methods and derive their complexity bounds. Throughout the chapter, the resulting schemes and results are illustrated and applied...
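As a concrete instance of the kind of fast scheme such a chapter builds toward, the following is a minimal FISTA-style accelerated proximal-gradient sketch for min f(x) + lam*||x||_1 with f smooth; grad_f, the Lipschitz constant L of the gradient, and lam are assumed to be supplied by the caller, and the sketch is a generic template rather than a result taken from the chapter.

import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def fista(grad_f, x0, L, lam, n_iter=300):
    # Accelerated proximal gradient for min f(x) + lam*||x||_1, with f smooth
    # and L a Lipschitz constant of grad_f.
    x = np.array(x0, dtype=float)
    y = x.copy()
    t = 1.0
    for _ in range(n_iter):
        x_new = soft_threshold(y - grad_f(y) / L, lam / L)   # proximal-gradient step at y
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)        # momentum extrapolation
        x, t = x_new, t_new
    return x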

A Class of Nested Iteration Schemes for Generalized Coupled Sylvester Matrix Equation

Global Krylov subspace methods are among the most efficient and robust methods for solving the generalized coupled Sylvester matrix equation. In this paper, we propose the nested splitting conjugate gradient process for solving this equation. The method has inner and outer iterations: the generalized conjugate gradient method is employed as an inner iteration to approximate each outer iterate, while each...
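The inner iteration in nested schemes of this kind is a conjugate-gradient-type solver. The sketch below is only the textbook conjugate gradient for a symmetric positive definite system A x = b, shown as the generic building block; it is not the global or nested splitting variant developed in the paper.

import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    # Textbook conjugate gradient for a symmetric positive definite system A x = b.
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x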

An accelerated gradient based iterative algorithm for solving systems of coupled generalized Sylvester-transpose matrix equations

In this paper, an accelerated gradient based iterative algorithm for solving systems of coupled generalized Sylvester-transpose matrix equations is proposed. The convergence analysis of the algorithm is investigated. We show that the proposed algorithm converges to the exact solution for any initial value under certain assumptions. Finally, some numerical examples are given to demons...
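For reference, the unaccelerated gradient iteration that such algorithms build on can be illustrated on a single Sylvester equation A X + X B = C: plain gradient descent on the squared residual. The coupled, transpose-containing system and the acceleration are precisely what the paper adds beyond this kind of update; the step size mu below is an illustrative constant that must be taken small enough for convergence.

import numpy as np

def sylvester_gradient(A, B, C, mu, n_iter=500):
    # Gradient descent on 0.5*||A X + X B - C||_F^2 for the Sylvester equation
    # A X + X B = C; mu must be a sufficiently small positive step size.
    X = np.zeros_like(C, dtype=float)
    for _ in range(n_iter):
        R = C - A @ X - X @ B              # current residual
        X = X + mu * (A.T @ R + R @ B.T)   # step along the negative gradient
    return X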


Journal:
  • CoRR

Volume: abs/1802.05581

Pages: -

Publication date: 2018